Handwritten Digit Recognition


Devanagari Digit Recognition using Quantum Machine Learning

Malla, Sahaj Raj

arXiv.org Artificial Intelligence

Handwritten digit recognition in regional scripts, such as Devanagari, is crucial for multilingual document digitization, educational tools, and the preservation of cultural heritage. The script's complex structure and limited annotated datasets pose significant challenges to conventional models. This paper introduces the first hybrid quantum-classical architecture for Devanagari handwritten digit recognition, combining a convolutional neural network (CNN) for spatial feature extraction with a 10-qubit variational quantum circuit (VQC) for quantum-enhanced classification. Trained and evaluated on the Devanagari Handwritten Character Dataset (DHCD), the proposed model achieves a state-of-the-art test accuracy for a quantum implementation of 99.80% and a test loss of 0.2893, with an average per-class F1-score of 0.9980. Compared to equivalent classical CNNs, our model demonstrates superior accuracy with significantly fewer parameters and enhanced robustness. By leveraging quantum principles such as superposition and entanglement, this work establishes a novel benchmark for regional script recognition, highlighting the promise of quantum machine learning (QML) in real-world, low-resource language settings.
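The VQC at the heart of such a hybrid model can be illustrated with a toy state-vector simulation. The sketch below is a hypothetical 2-qubit reduction of the paper's 10-qubit circuit, written in plain NumPy: features are angle-encoded with RY rotations, a CNOT supplies entanglement, a trainable RY layer follows, and the Pauli-Z expectation on the first qubit serves as the read-out.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT: flips qubit 1 when qubit 0 is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def vqc(features, weights):
    """Angle-encode 2 features, entangle, apply a variational RY layer,
    and return the <Z> expectation on qubit 0 as the classifier output."""
    state = np.array([1.0, 0.0, 0.0, 0.0])                       # |00>
    state = np.kron(ry(features[0]), ry(features[1])) @ state    # encoding layer
    state = CNOT @ state                                         # entangling layer
    state = np.kron(ry(weights[0]), ry(weights[1])) @ state      # trainable layer
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))                # Z on qubit 0
    return float(state @ z0 @ state)

print(vqc([0.0, 0.0], [0.0, 0.0]))  # |00> untouched -> 1.0
```

In the hybrid architecture the CNN's pooled features play the role of `features`, and a classical optimizer updates `weights` from the measured expectations.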


Handwritten Digit Recognition with a Back-Propagation Network

Neural Information Processing Systems

We present an application of back-propagation networks to handwritten digit recognition. Minimal preprocessing of the data was required, but the architecture of the network was highly constrained and specifically designed for the task. The input of the network consists of normalized images of isolated digits. The method has a 1% error rate and about a 9% reject rate on zipcode digits provided by the U.S. Postal Service.
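The training procedure the paper applies at scale can be sketched on a toy problem. The snippet below is a minimal NumPy illustration, not the paper's network: a 2-4-1 sigmoid network trained by back-propagation on XOR, showing the forward pass, the backward error pass, and the gradient-descent weight updates.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(0, 1.0, (2, 4)); b1 = np.zeros(4)       # hidden layer
W2 = rng.normal(0, 1.0, (4, 1)); b2 = np.zeros(1)       # output layer
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

def loss():
    h = sig(X @ W1 + b1)
    return float(np.mean((sig(h @ W2 + b2) - y) ** 2))

before = loss()
for _ in range(5000):
    h = sig(X @ W1 + b1)                                # forward pass
    out = sig(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)                 # backprop: MSE + sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)                  # backprop to hidden layer
    W2 -= h.T @ d_out; b2 -= d_out.sum(0)               # gradient-descent updates
    W1 -= X.T @ d_h;   b1 -= d_h.sum(0)
after = loss()
print(before > after)  # training reduces the loss
```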


Selective Attention for Handwritten Digit Recognition

Neural Information Processing Systems

Completely parallel object recognition is NP-complete. Achieving a recognizer with feasible complexity requires a compromise between parallel and sequential processing where a system selectively focuses on parts of a given image, one after another. Successive fixations are generated to sample the image and these samples are processed and abstracted to generate a temporal context in which results are integrated over time. A computational model based on a partially recurrent feedforward network is proposed and made credible by testing on the real-world problem of recognition of handwritten digits with encouraging results.
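The fixation idea can be sketched in a few lines of NumPy. The scan path and patch size below are invented for the example: small glimpses are cropped at successive fixation points, each is abstracted to a feature, and the features accumulate into a temporal context.

```python
import numpy as np

def glimpse(image, cy, cx, size=3):
    """Crop a size x size patch centred at fixation point (cy, cx)."""
    h = size // 2
    return image[cy - h:cy + h + 1, cx - h:cx + h + 1]

image = np.arange(64, dtype=float).reshape(8, 8)     # stand-in for a digit image
fixations = [(2, 2), (2, 5), (5, 2), (5, 5)]         # a fixed scan path

context = []
for cy, cx in fixations:
    patch = glimpse(image, cy, cx)                   # sample the image
    context.append(patch.mean())                     # abstract each sample
temporal_context = np.array(context)                 # integrate over time
print(temporal_context.shape)  # (4,)
```

In the paper's model the abstraction step is a network layer and the integration is done by recurrent state rather than a simple list, but the sampling structure is the same.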


Handwritten Digit Recognition using Convolutional Neural Network (CNN) with Tensorflow

#artificialintelligence

Handwritten Digit Recognition is the process of digitizing human handwritten digit images. It is a difficult task for the machine because handwritten digits are not perfect and can be written in many different styles. To address this issue, we created HDR, which uses the image of a digit to identify the digit present in the image. In this project, we developed a Convolutional Neural Network (CNN) model using the Tensorflow framework to recognize handwritten digits. A convolutional neural network (CNN, or ConvNet) is a Deep Learning algorithm that can take in an input image, assign learnable weights and biases to various objects in the image, and distinguish one from the other.
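The operation that gives a CNN its name can be shown directly. Below is a minimal NumPy sketch of a 2D convolution with valid padding and stride 1; the image and edge-detecting kernel are invented for illustration, and in the project this computation is performed by Tensorflow's convolutional layers with learned kernels.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image (valid padding, stride 1)."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

digit = np.zeros((5, 5)); digit[:, 2] = 1.0       # a crude vertical stroke
edge_kernel = np.array([[-1., 0., 1.]])           # responds to vertical edges
fmap = conv2d(digit, edge_kernel)
print(fmap.shape)  # (5, 3)
```

Stacking many such feature maps, interleaved with pooling and followed by dense layers, is what the CNN trains end to end.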


The Best Machine Learning Algorithm for Handwritten Digits Recognition

#artificialintelligence

Handwritten Digit Recognition is an interesting machine learning problem in which we have to identify the handwritten digits through various classification algorithms. There are a number of algorithms for recognizing handwritten digits, including Deep Learning/CNN, SVM, Gaussian Naive Bayes, KNN, Decision Trees, Random Forests, etc. In this article, we will deploy a variety of machine learning algorithms from the sklearn library on our dataset to classify the digits into their categories. We will use sklearn's load_digits dataset, which is a collection of 8x8 images (64 features) of digits. The dataset contains a total of 1797 sample points.
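That comparison can be reproduced in a few lines. The sketch below, assuming scikit-learn is installed, loads load_digits and scores a few of the listed classifiers on a held-out split; the particular classifiers and split are illustrative, not the article's exact setup.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

digits = load_digits()                     # 1797 samples, 64 features each
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# Fit each classifier on the training split and report test accuracy
for clf in (SVC(), KNeighborsClassifier(), GaussianNB()):
    acc = clf.fit(X_train, y_train).score(X_test, y_test)
    print(type(clf).__name__, round(acc, 3))
```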


Researchers rebuild the bridge between neuroscience and artificial intelligence

#artificialintelligence

The origin of machine and deep learning algorithms, which increasingly affect almost all aspects of our life, is the learning mechanism of synaptic (weight) strengths connecting neurons in our brain. Attempting to imitate these brain functions, researchers bridged between neuroscience and artificial intelligence over half a century ago. However, since then experimental neuroscience has not directly advanced the field of machine learning and both disciplines -- neuroscience and machine learning -- seem to have developed independently. In an article published today in the journal Scientific Reports, researchers reveal that they have successfully rebuilt the bridge between experimental neuroscience and advanced artificial intelligence learning algorithms. Conducting new types of experiments on neuronal cultures, the researchers were able to demonstrate a new accelerated brain-inspired learning mechanism.


Handwritten Digit Recognition Using Keras - Intro To Artificial Neural Networks

#artificialintelligence

Over the last decade, the use of artificial neural networks (ANNs) has increased considerably. With all the buzz about deep learning and artificial neural networks, haven't you always wanted to create one for yourself? In this Keras tutorial, we'll create a model to recognize handwritten digits. We use the Keras library for training the model in this tutorial. Keras is a high-level library in Python that is a wrapper over TensorFlow, CNTK and Theano.
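As a rough preview of what such a tutorial builds, here is a minimal Keras model sketch, assuming TensorFlow 2.x is installed; the layer sizes are illustrative, not the tutorial's exact code.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Flatten 28x28 digit images, one hidden dense layer, softmax over 10 classes
model = keras.Sequential([
    layers.Input(shape=(28, 28)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
print(model.count_params())  # 101770
```

Training is then one call to `model.fit` on image/label arrays such as those from `keras.datasets.mnist`.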


Neuro-memristive Circuits for Edge Computing: A review

Krestinskaya, Olga, James, Alex Pappachen, Chua, Leon O.

arXiv.org Artificial Intelligence

The volume, veracity, variability and velocity of data produced by the ever-increasing network of sensors connected to the Internet pose challenges for the power management, scalability and sustainability of cloud computing infrastructure. Increasing the data processing capability of edge computing devices at lower power requirements can reduce the overheads of cloud computing solutions. This paper provides a review of neuromorphic CMOS-memristive architectures that can be integrated into edge computing devices. We discuss why neuromorphic architectures are useful for edge devices and show the advantages, drawbacks and open problems of memristive circuits and architectures from an edge computing perspective.


Computational Eco-Systems for Handwritten Digits Recognition

Loquercio, Antonio, Della Torre, Francesca, Buscema, Massimo

arXiv.org Machine Learning

Inspired by the importance of diversity in biological systems, we built a heterogeneous system designed to achieve the same property. Our architecture can be summarized in two basic steps. First, we generate a diverse set of classification hypotheses using Convolutional Neural Networks, currently the state-of-the-art technique for this task, along with other traditional and innovative machine learning techniques. Then, we optimally combine them through Meta-Nets, a family of recently developed, high-performing ensemble methods.
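The two-step recipe (diverse base hypotheses, then a learned combiner) can be sketched with off-the-shelf tools. The example below is a hedged approximation, substituting scikit-learn's StackingClassifier for the authors' Meta-Nets and simple base learners for their CNN hypotheses.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: diverse base classifiers; step 2: a meta-learner combines them
ensemble = StackingClassifier(
    estimators=[("svm", SVC()), ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000),
)
acc = ensemble.fit(X_train, y_train).score(X_test, y_test)
print(round(acc, 3))
```

The point of the design is that the combiner learns which hypothesis to trust where, so the ensemble can outperform its best single member.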